Conference Proceedings
A Neural Network Model for Low-Resource Universal Dependency Parsing
L Duong, T Cohn, S Bird, P Cook
The Association for Computational Linguistics | Published: 2015
DOI: 10.18653/v1/d15-1040
Abstract
Accurate dependency parsing requires large treebanks, which are available for only a few languages. We propose a method that takes advantage of shared structure across languages to build an accurate parser from less training data: a model that learns a shared "universal" parser operating over an interlingual continuous representation of language, combined with language-specific mapping components. Compared with supervised learning, our method gives a consistent 8-10% improvement across several treebanks in low-resource simulations.
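The architecture described in the abstract, a shared parser over an interlingual space fed by language-specific mapping components, can be sketched roughly as follows. This is an illustrative toy in NumPy, not the paper's actual model: the dimensions, the `lang_proj` matrices, and the single shared scoring layer are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: per-language word embeddings and a shared
# interlingual space (values chosen for illustration only).
EMB_DIM, SHARED_DIM, N_ACTIONS = 50, 32, 3  # e.g. shift / left-arc / right-arc

# Language-specific mapping components: one projection matrix per language,
# each mapping that language's embeddings into the shared space.
lang_proj = {
    "en": rng.normal(scale=0.1, size=(SHARED_DIM, EMB_DIM)),
    "de": rng.normal(scale=0.1, size=(SHARED_DIM, EMB_DIM)),
}

# Shared "universal" parser parameters, reused across all languages, so
# low-resource languages benefit from structure learned on others.
W_shared = rng.normal(scale=0.1, size=(N_ACTIONS, SHARED_DIM))

def score_actions(lang, word_emb):
    """Map an embedding into the shared space, then score parser actions."""
    shared = np.tanh(lang_proj[lang] @ word_emb)  # language-specific mapping
    return W_shared @ shared                      # shared scoring layer

emb = rng.normal(size=EMB_DIM)
print(score_actions("en", emb).shape)  # one score per parser action
```

In this sketch only `lang_proj` grows with the number of languages; `W_shared` is trained on all treebanks jointly, which is the intuition behind needing less data per language.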